Performance Analysis of the Gradient Comparator LMS Algorithm

Authors

  • Bijit Kumar Das
  • Mrityunjoy Chakraborty
Abstract

The sparsity-aware zero-attractor least mean square (ZA-LMS) algorithm exhibits much lower misadjustment in strongly sparse environments than its sparsity-agnostic counterpart, the least mean square (LMS) algorithm, but has been shown to perform worse than the LMS as the sparsity of the impulse response decreases. The reweighted variant of the ZA-LMS, namely the RZA-LMS, is robust against this variation in sparsity, but at the price of increased computational complexity. Other variants, such as the l0-LMS and the improved proportionate normalized LMS (IPNLMS), also perform satisfactorily but are computationally intensive. The gradient comparator LMS (GC-LMS) is a practical solution to this trade-off when hardware constraints must be considered. In this paper, we analyse the mean and mean-square convergence performance of the GC-LMS algorithm in detail. The analyses match the simulation results satisfactorily.
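As a minimal sketch of the idea behind this family of algorithms: the ZA-LMS adds a zero-attraction term to the LMS update, and the gradient comparator selectively applies that term only to taps whose polarity matches that of the instantaneous gradient (as described for the gradient-compared variants). The variable names and the exact comparator condition below are illustrative, not the paper's definitive formulation:

```python
import numpy as np

def gc_lms_update(w, x, d, mu=0.01, rho=1e-4):
    """One illustrative GC-LMS iteration.

    w   : current filter taps (1-D array)
    x   : input regressor, same length as w
    d   : desired output sample
    mu  : LMS step size
    rho : zero-attractor strength
    """
    e = d - w @ x          # instantaneous error
    grad = -e * x          # gradient of the squared instantaneous error wrt w
    # Gradient comparator: attract a tap toward zero only when the tap
    # and the gradient share the same polarity (illustrative condition).
    mask = np.sign(w) == np.sign(grad)
    w_new = w + mu * e * x - rho * np.sign(w) * mask
    return w_new, e
```

With the comparator mask removed (all taps attracted), the update reduces to the plain ZA-LMS; with `rho = 0`, it reduces to the ordinary LMS.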


Related articles

Gradient Compared Lp-LMS Algorithms for Sparse System Identification

In this paper, we propose two novel p-norm penalty least mean square (lp-LMS) algorithms as supplements to the conventional lp-LMS algorithm recently established for sparse adaptive filtering. A gradient comparator is employed to selectively apply the zero attractor of the p-norm constraint only to those taps that have the same polarity as the gradient of the squared instantaneous error, w...


Digital LMS Adaptation of Analog Filters Without Gradient Information

The least mean square (LMS) algorithm has practical problems in the analog domain, mainly due to DC offset effects. If digital LMS adaptation is used, a digitizer (analog-to-digital converter or comparator) is required for each gradient signal as well as for the filter output. Furthermore, in some cases the state signals are not available anywhere in the analog signal path, necessitating additional a...


Tracking performance of incremental LMS algorithm over adaptive distributed sensor networks

In this paper, we focus on the tracking performance of the incremental adaptive LMS algorithm in an adaptive network. To this end, we consider the unknown weight vector to be a time-varying sequence. First, we analyze the performance of the network in tracking a time-varying weight vector, and then we explain the estimation of a Rayleigh fading channel through a random-walk model. Closed-form relations a...


An Analytical Model for Predicting the Convergence Behavior of the Least Mean Mixed-Norm (LMMN) Algorithm

The Least Mean Mixed-Norm (LMMN) algorithm is a stochastic gradient-based algorithm whose objective is to minimize a combination of the cost functions of the Least Mean Square (LMS) and Least Mean Fourth (LMF) algorithms. This algorithm inherits many of the properties and advantages of the LMS and LMF algorithms and mitigates their weaknesses in some ways. The main issue of the LMMN algorithm is t...
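The mixed-norm idea can be sketched as a stochastic-gradient update whose scalar factor blends the LMS and LMF error terms. The mixing parameter `delta` and the constant factors below are illustrative assumptions, not taken from this paper:

```python
import numpy as np

def lmmn_update(w, x, d, mu=0.01, delta=0.5):
    """One illustrative LMMN iteration.

    delta = 1 recovers a plain LMS update; delta = 0 gives an
    LMF-style (fourth-power) update. Constants are illustrative.
    """
    e = d - w @ x
    # Blend of the LMS gradient term (e) and the LMF gradient term (e**3)
    w_new = w + mu * e * (delta + (1.0 - delta) * e**2) * x
    return w_new, e
```

The `e**2` factor makes the effective step size grow with the error, so in practice `mu` must be chosen more conservatively than for plain LMS to keep the fourth-power term stable.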


Performance characteristics of the median LMS adaptive filter

The performance of gradient-search adaptive filters, such as the least mean squares (LMS) algorithm, may degrade badly when the filter is subjected to input signals corrupted by impulsive interference. The median LMS (MLMS) adaptive filter is designed to alleviate this problem by protecting the filter coefficients from the impact of the impulses. MLMS is a modification of LMS, obtaine...



Journal:
  • CoRR

Volume: abs/1605.02877

Year of publication: 2016